We propose a general algorithmic framework for constrained matrix and tensor factorization, which is widely used in signal processing and machine learning. The new framework is a hybrid between alternating optimization (AO) and the alternating direction method of multipliers (ADMM): each matrix factor is updated in turn, using ADMM, hence the name AO-ADMM. This combination can naturally accommodate a great variety of constraints on the factor matrices, and almost all possible loss measures for the fitting. Computation caching and warm start strategies are used to ensure that each update is evaluated efficiently, while the outer AO framework exploits recent developments in block coordinate descent (BCD)-type methods which help ensure that every limit point is a stationary point, as well as faster and more robust convergence in practice. Three special cases are studied in detail: non-negative matrix/tensor factorization, constrained matrix/tensor completion, and dictionary learning. Extensive simulations and experiments with real data are used to showcase the effectiveness and broad applicability of the proposed framework.
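As a concrete illustration of the alternating scheme described above, the following is a minimal Python sketch of AO-ADMM specialized to non-negative matrix factorization with least-squares loss. The structure (caching the Cholesky factor of W'W + rho*I across inner iterations, warm-starting the primal and dual variables across outer iterations) mirrors the caching and warm-start strategies mentioned in the abstract; the function names, the step-size heuristic rho = tr(W'W)/k, and the fixed inner iteration count are illustrative assumptions, not the authors' reference implementation.

    # Minimal AO-ADMM sketch for non-negative matrix factorization
    # (least-squares loss, non-negativity constraints); illustrative only.
    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    def admm_nnls(WtW, WtX, H, U, n_iters=10):
        """Inexactly solve min_{H >= 0} ||X - W H||_F^2 by ADMM.
        H (primal) and U (scaled dual) are warm-started from the
        previous outer iteration."""
        k = WtW.shape[0]
        rho = np.trace(WtW) / k              # heuristic step size (assumption)
        L = cho_factor(WtW + rho * np.eye(k))  # cached, reused every inner iter
        for _ in range(n_iters):
            H_tilde = cho_solve(L, WtX + rho * (H - U))  # least-squares step
            H = np.maximum(0.0, H_tilde + U)             # projection onto H >= 0
            U = U + H_tilde - H                          # dual update
        return H, U

    def ao_admm_nmf(X, k, n_outer=100, seed=0):
        rng = np.random.default_rng(seed)
        m, n = X.shape
        W = np.abs(rng.standard_normal((m, k)))
        H = np.abs(rng.standard_normal((k, n)))
        UH, UW = np.zeros_like(H), np.zeros_like(W.T)
        for _ in range(n_outer):
            # Alternating optimization: each factor updated in turn via ADMM.
            H, UH = admm_nnls(W.T @ W, W.T @ X, H, UH)
            Wt, UW = admm_nnls(H @ H.T, H @ X.T, W.T, UW)
            W = Wt.T
        return W, H

    if __name__ == "__main__":
        X = np.abs(np.random.default_rng(1).standard_normal((50, 40)))
        W, H = ao_admm_nmf(X, k=5)
        print("relative fit error:", np.linalg.norm(X - W @ H) / np.linalg.norm(X))

In the full framework, the same inner ADMM routine would be reused with different proximal (projection) steps and data-fitting terms to accommodate the other constraints and loss measures the abstract mentions; only the NMF specialization is sketched here.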